VC dimension theory for a learning system with forgetting

Author

  • Shotaro Akaho
Abstract

In a changing environment, forgetting old samples is an effective way to improve the adaptability of a learning system. However, forgetting too fast causes a decrease in generalization performance. In this paper, we analyze the generalization performance of a learning system with a forgetting parameter. For a class of binary discriminant functions, it is proved that the generalization error is O(√(γh)) (O(γh) in a certain case), where h is the VC dimension of the class of functions and γ < 1 represents the forgetting rate. The result provides a criterion for determining the optimal forgetting rate.
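The trade-off the abstract describes can be illustrated with a minimal sketch (a hypothetical exponential-forgetting estimator for illustration, not the paper's actual method): discounting old samples by a factor per time step shrinks the effective sample size to roughly 1/γ, which is why a bound of the form √(γh) appears.

```python
# Sketch: empirical risk with exponential forgetting (hypothetical
# illustration; `forgetting_risk` and the weighting scheme are assumptions,
# not the estimator analyzed in the paper).

def forgetting_risk(losses, gamma):
    """Discounted average loss; losses[0] is the oldest sample.

    Sample i (0 = oldest) gets weight (1 - gamma)**age, so recent samples
    dominate when gamma is large. The effective number of samples is
    roughly 1/gamma, matching the sqrt(gamma * h) flavor of the bound.
    """
    n = len(losses)
    weights = [(1 - gamma) ** (n - 1 - i) for i in range(n)]
    return sum(w * l for w, l in zip(weights, losses)) / sum(weights)

# Faster forgetting weights the recent losses more heavily:
losses = [1.0, 1.0, 0.0, 0.0]          # old errors, recent successes
print(forgetting_risk(losses, 0.1))    # mild forgetting: closer to 0.5
print(forgetting_risk(losses, 0.9))    # strong forgetting: close to 0.0
```

Choosing γ then trades tracking speed in a drifting environment against the larger estimation error that comes from the smaller effective sample.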


Similar Articles

Error Bounds for Real Function Classes Based on Discretized Vapnik-Chervonenkis Dimensions

The Vapnik-Chervonenkis (VC) dimension plays an important role in statistical learning theory. In this paper, we propose the discretized VC dimension obtained by discretizing the range of a real function class. Then, we point out that Sauer’s Lemma is valid for the discretized VC dimension. We group the real function classes having the infinite VC dimension into four categories by using the dis...


A Capacity Scaling Law for Artificial Neural Networks

By assuming an ideal neural network with gating functions handling the worst case data, we derive the calculation of two critical numbers predicting the behavior of perceptron networks. First, we derive the calculation of what we call the lossless memory (LM) dimension. The LM dimension is a generalization of the Vapnik–Chervonenkis (VC) dimension that avoids structured data and therefore provi...


Popper, Falsification and the VC-dimension

We compare Sir Karl Popper’s ideas concerning the falsifiability of a theory with similar notions from VC-theory. Having located some divergences, we discuss how best to view Popper’s work from the perspective of statistical learning theory.


Higher dimensional PAC-learning and VC-dimension

The VC-dimension (Vapnik-Chervonenkis dimension) was introduced in the 1970s in relation to computational learning theory, combinatorics, and model theory, a branch of mathematical logic. In fact, it is well known that for a given class C, PAC-learnability of C, the finiteness of the VC-dimension of C, and the dependency (a notion in model theory) of a formula defining C are essentially the ...


Lecture 5: Rademacher Complexity

Last time we introduced the VC dimension and saw one of the fundamental results in statistical learning theory. Recall that for a hypothesis space H : X → {0, 1}, we say that H shatters a sample C ⊂ X if H can realize all possible binary labelings of C. The VC dimension is the size of the largest set C that can be shattered by H. For binary classification with 0/1 loss, if the hypothesis sp...
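The shattering definition in the snippet above can be checked directly for a toy class. The sketch below (an illustrative example of my own, not taken from the lecture) enumerates the labelings that one-dimensional threshold classifiers x ↦ 1[x ≥ t] realize on a sample, confirming that they shatter any single point but no two-point set, i.e. their VC dimension is 1.

```python
from itertools import product

def realized_labelings(sample, thresholds):
    """All binary labelings of `sample` realizable by some x -> 1[x >= t]."""
    return {tuple(1 if x >= t else 0 for x in sample) for t in thresholds}

def shatters(sample, thresholds):
    """True iff the threshold class realizes every labeling of `sample`."""
    all_labelings = set(product((0, 1), repeat=len(sample)))
    return realized_labelings(sample, thresholds) == all_labelings

thresholds = [float(t) for t in range(-5, 6)]  # candidate thresholds

print(shatters([0.0], thresholds))        # a single point is shattered
print(shatters([0.0, 2.0], thresholds))   # labeling (1, 0) is unrealizable
```

The second call fails because labeling the smaller point 1 and the larger point 0 would require a threshold that is simultaneously ≤ 0 and > 2.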



Journal:

Volume   Issue

Pages  -

Publication date: 2007